Edward

Contents

  • Introduction
  • Edward: A library for probabilistic modeling, inference, and criticism
  • Deep Probabilistic Programming (related to deep generative models)

Introduction

  • Edward supports a broad class of probabilistic models, efficient algorithms for inference, and many techniques for model criticism
    • Named after the statistician George Edward Pelham Box
  • For modeling, Edward supports
    • Directed graphical models
    • Stochastic neural networks
    • Programs with stochastic control flow
  • For inference, Edward provides
    • Stochastic and black box variational inference
    • Hamiltonian Monte Carlo
    • Stochastic gradient Langevin dynamics
    • Infrastructure to make it easy to develop new algorithms.
  • For criticism (see the sketch after this list), Edward provides
    • Scoring rules
    • Predictive checks
  • Built on top of TensorFlow to support distributed training and hardware such as GPUs
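
As a quick taste of the criticism tools, here is a minimal sketch (Edward 1.x on TensorFlow 1.x is assumed throughout the sketches in these notes). The names x_post and x_data are hypothetical: they stand for a fitted model's posterior predictive random variable and held-out observations.

    import edward as ed
    import tensorflow as tf

    # Scoring rule: held-out log-likelihood under the posterior predictive.
    # x_post and x_data are assumed to come from a model fit earlier.
    print(ed.evaluate('log_likelihood', data={x_post: x_data}))

    # Predictive check: compare a test statistic (here the mean) on data
    # replicated from the model against the observed data.
    print(ed.ppc(lambda xs, zs: tf.reduce_mean(tf.cast(xs[x_post], tf.float32)),
                 data={x_post: x_data}))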

Why Edward?

We are interested in deploying probabilistic models in many real-world applications. These vary in the size and structure of the data, such as large text corpora or many brief audio signals, and in the size and class of the model, such as small nonparametric models or deep generative models. Edward addresses this range by combining:
  • A variety of inference algorithms, so the method can be matched to the model and data
  • Computational frameworks, namely TensorFlow, for distributed training and GPUs
  • High-level deep learning libraries
    • Similar in spirit to Keras
    • Keras and Edward can be used together (see the VAE sketch below)

References

  • Presentation: "Edward: A library for probabilistic modeling, inference, and criticism" (the library itself is described in Tran et al., arXiv:1610.09787)
  • Tran et al., "Deep Probabilistic Programming", ICLR 2017 (arXiv:1701.03757)

Simple Example

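A minimal Beta-Bernoulli example in the spirit of Edward's getting-started material, covering the modeling and inference steps (the data values are made up):

    import edward as ed
    import numpy as np
    import tensorflow as tf
    from edward.models import Bernoulli, Beta

    # Data: ten coin flips (made-up values).
    x_data = np.array([0, 1, 0, 0, 0, 0, 0, 0, 0, 1])

    # Model: Beta prior on the coin's bias, Bernoulli likelihood.
    theta = Beta(1.0, 1.0)
    x = Bernoulli(probs=tf.ones(10) * theta)

    # Inference: variational Beta approximation with trainable parameters.
    qtheta = Beta(tf.nn.softplus(tf.Variable(0.5)),
                  tf.nn.softplus(tf.Variable(0.5)))
    inference = ed.KLqp({theta: qtheta}, data={x: x_data})
    inference.run(n_iter=1000)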

Model & Composing Random Variables
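
Random variables compose like ordinary tensors, so the output of one can parameterize another. A two-line sketch:

    import tensorflow as tf
    from edward.models import Exponential, Normal

    # Composition: one random variable parameterizes another.
    scale = Exponential(rate=1.0)
    y = Normal(loc=0.0, scale=scale)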

Directed Graphical Model
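
For example, Bayesian linear regression is a small directed graphical model: weight and bias priors point into the likelihood of the response. A sketch with illustrative sizes:

    import edward as ed
    import tensorflow as tf
    from edward.models import Normal

    N, D = 100, 5  # illustrative sizes
    X = tf.placeholder(tf.float32, [N, D])
    w = Normal(loc=tf.zeros(D), scale=tf.ones(D))
    b = Normal(loc=tf.zeros(1), scale=tf.ones(1))
    y = Normal(loc=ed.dot(X, w) + b, scale=tf.ones(N))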

Neural Network
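
A stochastic (Bayesian) neural network puts priors on the weights, while the forward pass is ordinary TensorFlow. A minimal one-hidden-layer sketch (sizes illustrative; the noise scale 0.1 is an arbitrary choice):

    import tensorflow as tf
    from edward.models import Normal

    N, D, H = 100, 5, 16  # illustrative sizes
    X = tf.placeholder(tf.float32, [N, D])
    W_0 = Normal(loc=tf.zeros([D, H]), scale=tf.ones([D, H]))
    b_0 = Normal(loc=tf.zeros(H), scale=tf.ones(H))
    W_1 = Normal(loc=tf.zeros([H, 1]), scale=tf.ones([H, 1]))
    b_1 = Normal(loc=tf.zeros(1), scale=tf.ones(1))

    # Forward pass through random weights defines the likelihood's mean.
    h = tf.tanh(tf.matmul(X, W_0) + b_0)
    y = Normal(loc=tf.matmul(h, W_1) + b_1, scale=0.1)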

Inference
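
Inference binds each latent variable to an approximating variable and each observed variable to data. Continuing the regression sketch above (reusing w, b, X, y, N, D), a variational inference sketch; X_train and y_train are made-up arrays:

    import numpy as np

    X_train = np.random.randn(N, D).astype(np.float32)  # made-up data
    y_train = np.random.randn(N).astype(np.float32)

    # Fully factorized Gaussian variational family, trainable parameters.
    qw = Normal(loc=tf.Variable(tf.zeros(D)),
                scale=tf.nn.softplus(tf.Variable(tf.zeros(D))))
    qb = Normal(loc=tf.Variable(tf.zeros(1)),
                scale=tf.nn.softplus(tf.Variable(tf.zeros(1))))

    inference = ed.KLqp({w: qw, b: qb}, data={X: X_train, y: y_train})
    inference.run(n_samples=5, n_iter=1000)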

Composing Inferences
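
Inference programs also compose: separate inference objects can be stepped in alternation. A variational EM sketch on a toy Gaussian mixture, along the lines of the Deep Probabilistic Programming paper (sizes and data are made up):

    import edward as ed
    import numpy as np
    import tensorflow as tf
    from edward.models import Categorical, Normal, PointMass

    N, K, D = 500, 3, 2  # illustrative sizes
    x_train = np.random.randn(N, D).astype(np.float32)  # made-up data

    # Model: mixture of Gaussians with global means beta, assignments z.
    beta = Normal(loc=tf.zeros([K, D]), scale=tf.ones([K, D]))
    z = Categorical(logits=tf.zeros([N, K]))
    x = Normal(loc=tf.gather(beta, z), scale=tf.ones([N, D]))

    # E-step posterior for z; M-step point estimate for beta.
    qz = Categorical(logits=tf.Variable(tf.zeros([N, K])))
    qbeta = PointMass(params=tf.Variable(tf.zeros([K, D])))

    inference_e = ed.KLqp({z: qz}, data={x: x_train, beta: qbeta})
    inference_m = ed.MAP({beta: qbeta}, data={x: x_train, z: qz})

    inference_e.initialize()
    inference_m.initialize()
    sess = ed.get_session()
    sess.run(tf.global_variables_initializer())
    for _ in range(1000):
      inference_e.update()
      inference_m.update()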

Hybrid Algorithms
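
Hybrid algorithms mix inference classes across latent variables, e.g., variational inference for discrete assignments and a Monte Carlo method for continuous parameters. One plausible combination on the same toy mixture as above (not the paper's exact code):

    import edward as ed
    import numpy as np
    import tensorflow as tf
    from edward.models import Categorical, Empirical, Normal

    N, K, D = 500, 3, 2  # illustrative sizes
    x_train = np.random.randn(N, D).astype(np.float32)  # made-up data

    # Same mixture model as in the variational EM sketch.
    beta = Normal(loc=tf.zeros([K, D]), scale=tf.ones([K, D]))
    z = Categorical(logits=tf.zeros([N, K]))
    x = Normal(loc=tf.gather(beta, z), scale=tf.ones([N, D]))

    # Hybrid: variational inference for the discrete assignments z,
    # stochastic gradient Langevin dynamics for the continuous means beta.
    T = 10000  # length of the Langevin chain (illustrative)
    qz = Categorical(logits=tf.Variable(tf.zeros([N, K])))
    qbeta = Empirical(params=tf.Variable(tf.zeros([T, K, D])))

    inference_z = ed.KLqp({z: qz}, data={x: x_train, beta: qbeta})
    inference_beta = ed.SGLD({beta: qbeta}, data={x: x_train, z: qz})

    inference_z.initialize()
    inference_beta.initialize()
    sess = ed.get_session()
    sess.run(tf.global_variables_initializer())
    for _ in range(T):
      inference_z.update()
      inference_beta.update()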

Message Passing
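
Composition likewise allows message-passing-style schemes: inference runs on partitions of the data, and the partitions communicate through a shared approximation of the global variable. A rough sketch of the idea only, not a full expectation propagation implementation (all names and data made up):

    import edward as ed
    import numpy as np
    import tensorflow as tf
    from edward.models import Normal

    N1, N2 = 1000, 2000  # sizes of the two data shards (illustrative)
    x1_train = np.random.randn(N1).astype(np.float32)  # made-up shards
    x2_train = np.random.randn(N2).astype(np.float32)

    # One global latent variable generates both shards.
    beta = Normal(loc=0.0, scale=1.0)
    x1 = Normal(loc=beta, scale=1.0, sample_shape=N1)
    x2 = Normal(loc=beta, scale=1.0, sample_shape=N2)

    # Both inferences read and refine the same shared approximation,
    # which plays the role of the exchanged message.
    qbeta = Normal(loc=tf.Variable(0.0),
                   scale=tf.nn.softplus(tf.Variable(0.0)))
    inference_1 = ed.KLqp({beta: qbeta}, data={x1: x1_train})
    inference_2 = ed.KLqp({beta: qbeta}, data={x2: x2_train})

    inference_1.initialize()
    inference_2.initialize()
    sess = ed.get_session()
    sess.run(tf.global_variables_initializer())
    for _ in range(1000):
      inference_1.update()
      inference_2.update()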

Deep Probabilistic Programming

  • Compositionality
    • The nature of deep neural networks is compositional.
    • Users can connect layers in creative ways, without having to worry about how to perform
      • testing (forward propagation)
      • inference (gradient-based optimization, with backpropagation and automatic differentiation)
  • Compositional representations for probabilistic programming
    • Random variables
    • Inference
  • Probabilistic programming lets users
    • Specify generative probabilistic models as programs
    • “compile” those models down into inference procedures.
  • We propose Edward, a new Turing-complete probabilistic programming language
    • Builds on two compositional representations: random variables and inference
  • Previous probabilistic programming systems
    • Have focused on building rich probabilistic programs by composing random variables
    • Cannot capture recent advances such as those in variational inference
  • Edward's aim: analogous compositionality for inference, illustrated by the VAE, Bayesian RNN, and GAN sketches below

VAE
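
A variational autoencoder sketch close to the paper's figure, with Keras layers used inside the Edward model (sizes illustrative; x_ph feeds minibatches of binarized images):

    import edward as ed
    import tensorflow as tf
    from edward.models import Bernoulli, Normal
    from keras.layers import Dense

    M, d = 100, 10  # minibatch size and latent dimension (illustrative)

    # Generative network: latent code z -> Bernoulli logits over pixels.
    z = Normal(loc=tf.zeros([M, d]), scale=tf.ones([M, d]))
    h = Dense(256, activation='relu')(z.value())
    x = Bernoulli(logits=Dense(28 * 28)(h))

    # Inference network: observed pixels -> Gaussian approximation of z.
    x_ph = tf.placeholder(tf.int32, [M, 28 * 28])
    qh = Dense(256, activation='relu')(tf.cast(x_ph, tf.float32))
    qz = Normal(loc=Dense(d)(qh),
                scale=Dense(d, activation='softplus')(qh))

    # Amortized variational inference; training then feeds minibatches:
    #   inference.update(feed_dict={x_ph: x_batch})
    inference = ed.KLqp({z: qz}, data={x: x_ph})
    inference.initialize()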

Bayesian RNN
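
A Bayesian RNN sketch along the lines of the paper's example: priors on the recurrent weights, with hidden states unrolled by tf.scan (sizes illustrative):

    import edward as ed
    import tensorflow as tf
    from edward.models import Normal

    H, D = 32, 8  # hidden and input dimensions (illustrative)

    W_h = Normal(loc=tf.zeros([H, H]), scale=tf.ones([H, H]))
    W_x = Normal(loc=tf.zeros([D, H]), scale=tf.ones([D, H]))
    W_y = Normal(loc=tf.zeros([H, 1]), scale=tf.ones([H, 1]))
    b_h = Normal(loc=tf.zeros(H), scale=tf.ones(H))
    b_y = Normal(loc=tf.zeros(1), scale=tf.ones(1))

    def rnn_cell(hprev, xt):
      """One step of a vanilla RNN with random weights."""
      return tf.tanh(ed.dot(hprev, W_h) + ed.dot(xt, W_x) + b_h)

    x = tf.placeholder(tf.float32, [None, D])  # one input sequence
    h = tf.scan(rnn_cell, x, initializer=tf.zeros(H))
    y = Normal(loc=tf.matmul(h, W_y) + b_y, scale=1.0)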

GAN
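
A GAN sketch using ed.GANInference: a generative network transforms noise into samples, and the discriminator function is handed to the inference object, following the pattern in Edward's GAN material (network sizes and x_ph are illustrative):

    import edward as ed
    import tensorflow as tf
    from edward.models import Uniform

    slim = tf.contrib.slim
    M, d = 100, 10  # minibatch size and noise dimension (illustrative)

    def generative_network(eps):
      h = slim.fully_connected(eps, 256, activation_fn=tf.nn.relu)
      return slim.fully_connected(h, 28 * 28, activation_fn=tf.sigmoid)

    def discriminative_network(x):
      """Outputs the logit probability that x is real data."""
      h = slim.fully_connected(x, 256, activation_fn=tf.nn.relu)
      return slim.fully_connected(h, 1, activation_fn=None)

    # Implicit generative model: transform noise into a sample.
    with tf.variable_scope('Gen'):
      eps = Uniform(low=tf.zeros([M, d]) - 1.0, high=tf.ones([M, d]))
      x = generative_network(eps)

    # Adversarial training; x_ph feeds minibatches of real images, e.g.
    #   inference.update(feed_dict={x_ph: x_batch})
    x_ph = tf.placeholder(tf.float32, [M, 28 * 28])
    inference = ed.GANInference(data={x: x_ph},
                                discriminator=discriminative_network)
    inference.initialize()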

